"Cheap and lazy" - Historians criticize AI history videos that have been clicked millions of times


AI-generated history videos are drawing both fascination and criticism on TikTok. Historians point to inaccuracies and warn of distorted portrayals of the past.
AI-generated videos circulating on TikTok make an unusual promise: they claim to show the perspective of people who lived in long-past eras. Some of the footage looks as if the protagonists were filming themselves with a cell phone.
Concerned historians are now speaking out to the BBC. The videos are hugely popular, but their historical accuracy is in doubt.
Speaking to the BBC, historians have criticized inconsistencies in the videos. Historian and archaeologist Dr. Hannah Platts, for example, describes parts of the execution as "cheap and lazy"; her trained eye immediately spotted a number of errors.
Barbara Keys, professor of US history at Durham University, told the BBC that the AI-generated videos often "contain no information whatsoever". According to the report, that in itself is not necessarily a problem. Errors, however, can distort the picture of history, and in extreme cases the format can be used to deliberately manipulate viewers' opinions about historical events.
Nevertheless, the experts quoted by the BBC conclude that the AI videos have potential: they could encourage viewers to do their own research. Futurologist Thomas Druyen takes a similar stance on artificial intelligence in general. He has been studying the influence of AI systems on the human psyche since 2015.
He warns that uncritical use poses dangers for human creativity. AI, he argues, must be understood as a tool that supports and accelerates creative processes. But if the machine itself becomes the creative agent, there is a risk of falling into a "stimulus-response pattern". Recommendation algorithms are one example: they steer people's online behavior, leaving users merely reacting to suggested content.
FOCUS